
    A Roadmap for HEP Software and Computing R&D for the 2020s

    Particle physics has an ambitious and broad experimental programme for the coming decades. This programme requires large investments in detector hardware, either to build new facilities and experiments, or to upgrade existing ones. Similarly, it requires commensurate investment in the R&D of software to acquire, manage, process, and analyse the sheer amounts of data to be recorded. In planning for the HL-LHC in particular, it is critical that all of the collaborating stakeholders agree on the software goals and priorities, and that the efforts complement each other. In this spirit, this white paper describes the R&D activities required to prepare for this software upgrade.

    IML Machine Learning Workshop

    Deep Convolutional Neural Networks (CNNs) have been widely applied in computer vision to solve complex problems in image recognition and analysis. In recent years, many efforts have emerged to extend the use of this technology to HEP applications, including the Convolutional Visual Network (CVN), our implementation for identification of neutrino events. In this presentation I will describe the core concepts of CNNs, the details of our particular implementation in the Caffe framework, and our application to identify NOvA events. NOvA is a long-baseline neutrino experiment whose main goal is the measurement of neutrino oscillations. This relies on the accurate identification and reconstruction of the neutrino flavor in the interactions we observe. In 2016 the NOvA experiment released results for the observation of oscillations in the νμ → νe channel, the first HEP result employing CNNs. I will also discuss our approach to event identification in NOvA as well as recent developments in the application of CNNs for particle tagging at NOvA, event identification at DUNE, and other ongoing work.
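
    As a concrete illustration of the approach sketched above, the following is a minimal convolutional classifier in PyTorch. It is not the NOvA CVN itself (which is implemented in Caffe); the single-view 100x80 pixel-map input and the three output classes are illustrative assumptions only.

# Minimal, hypothetical sketch of a convolutional event classifier in PyTorch.
# It is NOT the NOvA CVN (the CVN is implemented in Caffe); the 100x80
# pixel-map size and the three output classes are illustrative assumptions.
import torch
import torch.nn as nn

class ToyEventCNN(nn.Module):
    def __init__(self, n_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 1 detector view in, 16 feature maps out
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 100x80 -> 50x40
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                               # 50x40 -> 25x20
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 25 * 20, 128),
            nn.ReLU(),
            nn.Linear(128, n_classes),                     # one score per event class
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

if __name__ == "__main__":
    model = ToyEventCNN()
    pixel_map = torch.randn(1, 1, 100, 80)   # one fake calibrated-hit pixel map
    scores = model(pixel_map)
    print(scores.softmax(dim=1))             # per-class probabilities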

    Barium daughter tagging using single molecule fluorescence imaging

    A robust observation of neutrinoless double beta decay is currently the only known method to determine the Majorana nature of the neutrino. The detection of the single barium ion produced as a result of the double beta decay of xenon-136 would enable a new class of ultra-low background neutrinoless double beta decay experiments. However, despite more than 20 years of R&D, a credible method to collect and identify individual barium ions in bulk xenon has remained elusive.

    We will present a recent milestone in barium tagging R&D: single barium dication resolution using the technique of single molecule fluorescence imaging (SMFI). This R&D adapts techniques from biochemistry and microscopy to yield a novel technology with potential to extend the sensitivity of neutrinoless double beta decay searches. Individual ions are resolved with high statistical significance and with super-resolution on the nanometer scale. We will present on recent developments and current status.
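
    The abstract does not detail the localization procedure, but a standard ingredient of SMFI super-resolution imaging is fitting a 2D Gaussian model of the point-spread function to each single-molecule spot and taking the fitted centre as the emitter position. The sketch below illustrates only that generic idea; the pixel grid, spot parameters and noise level are made-up values, not those of the experiment.

# Generic illustration of PSF-fitting localization (not the paper's analysis):
# fit a 2D Gaussian to a simulated single-molecule spot and take the fitted
# centre as the emitter position, which is determined to a small fraction of
# a pixel, i.e. well below the optical diffraction limit.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(coords, amp, x0, y0, sigma, offset):
    x, y = coords
    return (amp * np.exp(-((x - x0) ** 2 + (y - y0) ** 2) / (2 * sigma ** 2)) + offset).ravel()

# Simulate a noisy 16x16-pixel camera image of one emitter (made-up parameters).
rng = np.random.default_rng(0)
x, y = np.meshgrid(np.arange(16), np.arange(16))
truth = dict(amp=200.0, x0=7.3, y0=8.6, sigma=1.8, offset=10.0)
image = gauss2d((x, y), **truth).reshape(16, 16) + rng.normal(0.0, 5.0, (16, 16))

# Fit the PSF model and report the sub-pixel centre.
p0 = (float(image.max()), 8.0, 8.0, 2.0, float(image.min()))
popt, _ = curve_fit(gauss2d, (x, y), image.ravel(), p0=p0)
print("fitted centre: x0 = %.2f px, y0 = %.2f px" % (popt[1], popt[2]))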

    Muon Energy Reconstruction Through the Multiple Scattering Method in the NOvA Detectors

    University of Minnesota M.S. thesis. June 2013. Major: Physics. Advisor: Alec Habig. 1 computer file (PDF); v, 54 pages.

    Neutrino energy measurements are a crucial component in the experimental study of neutrino oscillations. These measurements are made by reconstructing neutrino interactions and measuring the energies of their products. This thesis presents the development of a technique to reconstruct the energy of muons from neutrino interactions in the NOvA experiment. This is achieved through an understanding of muon multiple scattering within the NOvA detectors. The technique is particularly useful for estimating the energies of muons that escape the detector.
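
    The abstract does not spell out the estimator, but the standard relation between multiple Coulomb scattering and momentum is the PDG Highland formula, in which the RMS projected scattering angle for a particle of momentum p traversing a thickness x of material with radiation length X0 is theta_0 = (13.6 MeV / (beta c p)) sqrt(x/X0) [1 + 0.038 ln(x/X0)]. The sketch below numerically inverts this relation to turn a measured RMS angle into a momentum estimate; the segment thickness and the measured angle are illustrative example values, not numbers from the thesis.

# Illustrative sketch: estimate a muon's momentum from its RMS multiple-scattering
# angle by inverting the PDG Highland formula. The segment thickness (in radiation
# lengths) and the "measured" RMS angle below are made-up example values.
import math

MUON_MASS = 105.66  # MeV/c^2

def highland_theta0(p_mev, x_over_X0):
    """RMS projected scattering angle (radians) for a muon of momentum p_mev (MeV/c)."""
    energy = math.hypot(p_mev, MUON_MASS)
    beta = p_mev / energy
    return (13.6 / (beta * p_mev)) * math.sqrt(x_over_X0) * (1.0 + 0.038 * math.log(x_over_X0))

def momentum_from_theta0(theta0_meas, x_over_X0, p_lo=50.0, p_hi=50000.0):
    """Invert the Highland formula by bisection: more scattering means lower momentum."""
    for _ in range(60):
        p_mid = 0.5 * (p_lo + p_hi)
        if highland_theta0(p_mid, x_over_X0) > theta0_meas:
            p_lo = p_mid   # predicted scattering too large -> momentum must be higher
        else:
            p_hi = p_mid
    return 0.5 * (p_lo + p_hi)

if __name__ == "__main__":
    x_over_X0 = 0.25      # track segment a quarter of a radiation length thick (example)
    theta0_meas = 0.015   # measured RMS projected scattering angle in radians (example)
    p = momentum_from_theta0(theta0_meas, x_over_X0)
    print("estimated muon momentum: %.0f MeV/c" % p)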

    HEP Software Foundation Community White Paper Working Group - Training, Staffing and Careers

    The rapid evolution of technology and the parallel increase in the complexity of algorithmic analysis in HEP require developers to acquire a much larger portfolio of programming skills. Young researchers graduating from universities worldwide currently do not receive adequate preparation in the very diverse fields of modern computing to respond to the growing needs of the most advanced experimental challenges. There is a growing consensus in the HEP community on the need for training programmes to bring researchers up to date with new software technologies, in particular in the domains of concurrent programming and artificial intelligence. We review some of the initiatives under way to introduce new training programmes and highlight some of the issues that need to be taken into account for these to be successful.

    Machine Learning in High Energy Physics Community White Paper

    Machine learning is an important research area in particle physics, beginning with applications to high-level physics analysis in the 1990s and 2000s, followed by an explosion of applications in particle and event identification and reconstruction in the 2010s. In this document we discuss promising future research and development areas in machine learning in particle physics with a roadmap for their implementation, software and hardware resource requirements, collaborative initiatives with the data science community, academia and industry, and training the particle physics community in data science. The main objective of the document is to connect and motivate these areas of research and development with the physics drivers of the High-Luminosity Large Hadron Collider and future neutrino experiments and identify the resource needs for their implementation. Additionally, we identify areas where collaboration with external communities will be of great benefit.

    DUNE Offline Computing Conceptual Design Report

    This document describes Offline Software and Computing for the Deep Underground Neutrino Experiment (DUNE), in particular the conceptual design of the offline computing needed to accomplish its physics goals. Our emphasis in this document is the development of the computing infrastructure needed to acquire, catalog, reconstruct, simulate and analyze the data from the DUNE experiment and its prototypes. In this effort, we concentrate on developing the tools and systems that facilitate the development and deployment of advanced algorithms. Rather than prescribing particular algorithms, our goal is to provide resources that are flexible and accessible enough to support creative software solutions as HEP computing evolves and to provide computing that achieves the physics goals of the DUNE experiment.